SYSTEM FOR TRACKING THE GAZE POINT OF AN OBSERVER WATCHING AN OBJECT AND METHOD FOR TRACKING THE GAZE POINT OF AN OBSERVER WATCHING AN OBJECT
Patent abstract:
A system for tracking the gaze of an observer watching an object, and a method for tracking the gaze of an observer watching an object. The system comprises a camera for recording an image of an eye of the observer, means for providing a luminous marker, and means for analyzing the image of the eye to determine the corneal reflection of the marker and the center of the pupil. The relative positions of the marker's corneal reflection and the center of the pupil are measured. The marker is then repositioned, depending on the relative positions determined, to improve the correspondence between the marker's corneal reflection and the center of the pupil.

Publication number: BR112012023787B1
Application number: R112012023787-0
Filing date: 2011-03-15
Publication date: 2021-08-10
Inventors: Frederik Jan De Bruijn; Harold Agnes Wilhelmus Schmeitz; Karl Catharina Van Bree; Tommaso Gritti
Applicant: Koninklijke Philips N.V.
IPC main class:
Patent description:
FIELD OF THE INVENTION

The invention relates to a system for tracking the gaze point of an observer watching an object, in which the system comprises a device for recording an image of an eye of the observer, means for providing a luminous marker on or associated with the observed object, and means for determining, from the image, a position of a corneal reflection of the marker and of the center of the pupil. The invention further relates to a method for tracking the gaze point of an observer watching an object, wherein the method comprises recording an image of an eye of the observer, providing a luminous marker on or associated with the observed object, and determining, from the image, a position of a corneal reflection of the marker and of the center of the pupil.

BACKGROUND OF THE INVENTION

A system and method of the above type is known from US patent application US 2006/0110008. In this document, a number of markers are provided, and at least one marker whose corneal reflection is within a threshold distance from the pupil center is identified, or at least two markers having a corneal reflection close to the pupil center. The identified marker(s) are indicative of the observer's point of gaze on the object. Continuous measurement of viewing direction is usually called "gaze tracking" (the more ambiguous term "eye tracking" is also frequently used). There are several methods for performing gaze tracking. Video capture has proven to be a viable option for truly unobtrusive and remote gaze tracking. However, virtually all gaze-tracking systems require a user-specific calibration, after which only very little head movement is allowed. Consequently, these systems are confined to desktop use and are not suitable for consumer applications, in which a user-specific calibration is not a realistic option.
Calibration-free gaze tracking is widely pursued by various research centers, often by extrapolating from existing concepts (more cameras, more processing). The observer can be a human, whether adult or child, but also an animal. Document US 2006/0110008 attempts to remedy the problem and provide a calibration-free system and method. In the situation where the eye is looking directly at a light source, the reflection of that source in the cornea appears to coincide with the center of the pupil. The system and method of US 2006/0110008 provide a number of markers on the object. In the recorded eye image, the marker that has a corneal reflection within a threshold distance from the center of the pupil, or a number of markers having a corneal reflection close to the center of the pupil, is identified. These are used to estimate the direction of gaze. However, the known system has a number of disadvantages:
- Each marker must be labeled. This requires that a number of video frames be analyzed before a marker can be identified, which introduces latency in gaze tracking. Given the mobility of the eye, temporal latency quickly gives rise to motion artifacts.
- Where a single marker is used, there will always be an inaccuracy, equivalent to the threshold distance, in the established gaze direction. Where more than one marker is used, interpolation is required, which takes computational power and introduces inaccuracies.

SUMMARY OF THE INVENTION

It is an object of the invention to provide a system and method in which the speed and/or accuracy of gaze tracking is improved.
To this end, the system comprises means for determining a difference in position between the corneal reflection of the marker and the center of the pupil to provide a difference signal, and means for changing the position of the marker on or associated with the observed object, depending on the difference signal, so as to make the corneal reflection of the marker and the pupil center substantially coincide, thereby updating the position of the marker to substantially match the gaze point. The method is a method in which a difference between the corneal reflection of the marker and the center of the pupil is determined to provide a difference signal, and the position of the marker on or associated with the observed object is changed depending on the difference signal so that the corneal reflection of the marker and the center of the pupil substantially coincide, thereby updating the position of the marker to substantially match the gaze point. In the system and method of the invention, the temporal latency is reduced, since the marker is not static but is repositioned from feedback measurements of the relative positions of the marker's corneal reflection and the center of the pupil. The marker is repositioned, i.e. the position of the marker on or associated with the observed object is changed, to improve the correspondence between the corneal reflection of the marker and the center of the pupil, preferably to make them coincide or nearly coincide. When the corneal reflection and pupil center coincide, the gaze direction can be precisely determined, as the position of the marker on or associated with the observed object then indicates the gaze direction to a high degree of accuracy.
The observer can be, and in many circumstances will be, a human; however, within the method and system according to the invention, the observer can also be a non-human observer, for example an animal in animal studies, so that the gaze of animals such as dogs or primates can be studied. The system and method provide a number of advantages over previous methods:
- The system and method do not require user-dependent calibration, in contrast to many current solutions.
- The system and method are based on an optical property of the eye that is largely independent of head position and orientation and virtually invariant between different individuals. The system and method allow freedom of head movement far beyond existing gaze-tracking systems. This is an essential step toward enabling the use of gaze tracking in consumer applications.
- Regarding the system and method of document US 2006/0110008, there is no need to label separate markers or groups of markers with unique identifiers. The known system uses temporal light modulation to encode the different identifiers, which introduces a temporal latency (expressed in a number of frames) that is proportional to the code length. The system and method of the invention respond much more quickly, and the feedback system has been shown to converge rapidly, often within a single frame delay.
- Compared with the system and method of document US 2006/0110008, the correspondence between the corneal reflection of the marker and the center of the pupil can be greatly improved, reducing inaccuracies. By making the marker reflection and pupil center coincide, any inaccuracy due to the threshold value in US 2006/0110008 is reduced or minimized.
The system and method according to the invention use means to change the position of the marker in response to the difference between the positions of the corneal reflection and the center of the pupil in the recorded image.
Although adaptive marker repositioning makes the method and system somewhat more complex than the known system and method, which use fixed markers, the advantages are significant. In embodiments, the system comprises a display device, and the position of the marker is related to the display device. There are several ways to relate the marker's position to a display device (for example, a TV or monitor). One way is to display a marker on the display screen of the display device or on a transparent screen placed in front of the display screen; in preferred embodiments, the means for providing a marker and the display device are integrated into a single device. The display device itself, in embodiments, comprises means for providing a marker in an image on its display screen. This can, for example, be achieved by hiding a marker in the displayed visible image. Hiding, meaning that the marker is indistinguishable to an observer, can be done by providing the marker with a spectral signature and/or a temporal, spatial, or combined temporal and spatial signature, so that a camera tuned to such a signature is able to detect the marker while the observer is not. A non-noticeable marker can also be provided using light in a part of the spectrum outside the human visible range; the marker is then invisible to the human eye. Infrared (IR) and ultraviolet (UV) markers are examples of such markers. Even a marker in the visible range can be made undetectable to an observer. For example, a marker can be provided by forming an image sequence in which the marker is hidden from a human or animal eye but is distinguishable to a camera tuned to the marker. In such an embodiment, the marker has a temporal signature. In other embodiments, the marker is detectable to the human eye.
In embodiments, it may be advantageous to provide a human observer or a third person with directly visible information about the direction of gaze. In many cases, however, it is preferred that the marker be imperceptible to the observer, to avoid oscillatory behavior. If the marker is noticed, the observer's eyes may try to track it; this is difficult to avoid, as an observer's gaze can be drawn to the marker. Since the mobile marker follows the eye movement through system feedback, and the eye may, without the observer noticing, try to follow the marker, oscillatory behavior could occur. If necessary, such oscillatory behavior can be suppressed by introducing an indirect coupling between the measured point of gaze (POG) and the visually displayed POG (typically a mouse cursor). Such indirect coupling can be achieved through several methods known in the field of control engineering. An example of a linear solution is the use of a proportional-integral-derivative (PID) element in the control loop. An example of a non-linear solution is the use of a deadband, to create hysteresis between the measured and displayed POG. The display device and the means for providing a marker can be integrated into a single device, as stated above. Integrating them into a single device has the advantage that the correlation between the visible information and the marker is precisely known. A preferred example is a system comprising a liquid crystal display (LCD) which has both an IR backlight and a visible-light backlight. The LCD device is then able to provide both a visible image and an invisible IR marker on the display screen. Another example is a system comprising a multi-primary display device; such devices are also capable of providing a marker in an image displayed on the display screen. The display device is, in embodiments, an image projector device.
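As an illustration only (the patent names PID control and a deadband but gives no implementation), the indirect coupling between the measured and the displayed POG could be sketched as follows; the gains, the 60 Hz update rate, and the `update_cursor` helper are our assumptions, not part of the disclosure.

```python
class PID:
    """Proportional-integral-derivative element for the control loop."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, err, dt):
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv


def update_cursor(cursor, measured_pog, pid, dt=1.0 / 60, deadband=10.0):
    """Move the displayed POG (cursor) toward the measured POG.

    The deadband ignores small gaze jitter, creating hysteresis
    between the measured and the displayed POG."""
    err = measured_pog - cursor
    if abs(err) < deadband:
        return cursor  # inside the deadband: the cursor stays put
    return cursor + pid.step(err, dt) * dt
```

With a purely proportional gain, the cursor approaches the measured POG geometrically and then freezes once the error falls inside the deadband, so gaze jitter no longer shakes the cursor.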
Alternatively, the display device and the means for providing a marker may be separate. The means for providing a marker can, for example, be formed by a transparent shield positioned in front of the display screen of the display device. Inside the shield, a marker is generated, or a marker is projected onto the shield by a projector. In preferred embodiments in which a transparent shield is used in front of a display screen, the transparent shield is provided with means for determining the shield's position relative to the display screen of the display device. The marker can also be projected directly onto a display screen. Using separate devices has the advantage that a standard display device can be used within the system without many additional adjustments. In other embodiments, the system comprises means for providing the marker, establishing the position of the marker, and repositioning the marker directly on a background surface, or within or on a transparent plate in front of a background surface. In such embodiments, the system does not comprise a display device. Such a system is, for example, useful for monitoring the focal attention of a person driving a car or truck. For example, in a preferred embodiment, the transparent plate is a vehicle windshield, and on the inside of the windshield an invisible IR marker is generated, thus maintaining an unobstructed view of the road ahead. The driver's gaze is tracked, and if the tracking shows that the driver is in danger of falling asleep or losing attention, an attention signal is given to alert the driver. The projection of a marker onto a background surface can, for example, be advantageously used to track an observer's gaze through a shop window onto displayed goods. In embodiments, the system comprises means for providing more than one marker and means for identifying the markers and coupling the markers to the eyes of one or more observers.
The advantage of this embodiment of the system is that latency increases only as the number of tracked observers increases.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other objects and advantageous aspects will become more apparent from exemplary embodiments, which will be described using the following Figures.
Figure 1 illustrates the effect of specular light reflection.
Figure 2 illustrates the steady reflection of a fixed light source as the eye moves.
Figure 3 illustrates a basic gaze-tracking configuration.
Figure 4 illustrates a typical video frame from a current eye-tracking system.
Figure 5 illustrates the coincidence of the marker reflection and the center of the pupil when the observer is looking at the marker.
Figure 6 illustrates the difference between the method of document US 2006/0110008 and the method of the invention.
Figure 7 illustrates a system of the invention.
Figure 8 illustrates the basic principle of marker movement from feedback.
Figure 9 illustrates a method for finding the center of the pupil.
Figure 10 illustrates an example of a system according to the invention.
Figure 11 illustrates the relative transmission power T of a commonly used liquid crystal material in the visible light region 111 and the near-infrared light region 112 as a function of wavelength.
Figure 12 illustrates a further embodiment based on the use of a light-guide diffuser plate, positioned in front of a screen and illuminated by peripheral segmented IR illumination.
Figure 13 illustrates a system comprising an edge-lit IR screen.
Figure 14 shows how a pattern marker, in the example a cross, is visible as a reflection in the eye.
Figure 15 illustrates a pseudorandom IR light pattern used as a marker.
Figure 16 illustrates the use of two separately acquired IR targets and their appearance on the cornea as a way of estimating two proportionality constants, which serve as a first-order (linear) approximation of the real relationship.
Figure 17 illustrates a multi-observer method and system.
Figure 18 illustrates an embodiment in which a projector projects an IR target onto a scene.
Figure 19 schematically illustrates the use of a transparent OLED plate 191 in front of a screen 192.
The Figures are not drawn to scale. In general, identical components are designated by the same reference numerals in the figures.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

Figure 1 illustrates the effect of specular light reflection. The eyelids 1 surround the eye 2. The iris 4 comprises a pupil 3. The corneal reflection of a light source is schematically indicated by 5. Figure 2 illustrates that the corneal reflection, or 'glint', resulting from a fixed light source has a tendency to roughly maintain a fixed position as the eye rotates, changing the direction of gaze. The 'x' marker indicates the displacement of the center of the pupil as the subject changes the direction of gaze. This stable, relatively 'rotation-invariant' glint position forms the basis of virtually all commercially available gaze-tracking systems that use video and, for example, an IR light source. The diagram in Figure 3 shows how the projections of the glint and pupil center can be found by constructing the associated light rays: cam indicates the camera, l the light source, c the corneal center, p the pupil center, r the refraction of the pupil center, q the reflection of the light source, F the camera focal length, v the image of the pupil center in the camera, and u the image of the corneal reflection of light source l in the camera cam. The estimated direction of gaze, here represented by the optical axis o, is based on checking the captured center of the pupil against one or multiple glint positions.
Since the pupil is essentially a uniform circular disc, some image processing is required to obtain the center of the pupil from the usually elliptical projection of the pupil disc. Note that refraction occurs at the interface between air and the aqueous domain of the eye. In Figure 4, a video frame of an eye observed with a normal camera is shown. The four glints A, B, C and D on pupil P are reflections of four IR LEDs A', B', C' and D' at the corners of a screen. The estimated direction of gaze g on monitor M is shown in Figure 4. The accuracy with which the gaze direction g can be obtained with this approach is relatively high. However, the method has major disadvantages:
- Calibration is required to translate the position of the center of the pupil, relative to each glint position, into a gaze direction or a point of gaze on a display screen.
- As the relationship between the gaze point and this relative pupil position assumes stationary light sources, the tracker becomes very sensitive to changes in head position.
- Calibration is necessary for each new individual to deal with small differences in physiology (corneal surface curvature, pupil placement along the optical axis).
The relationship between the projection of the center of the pupil and the reflection of light has a remarkable property. In the situation where the eye is looking at a light source, the reflection of that source in the cornea appears to coincide with the center of the pupil. This situation is sketched in Figure 5. Note that this literal 'coincidence' of glint and pupil center is not a trivial situation; it turns out that the value of the refractive index of the eye and the placement of the pupil behind the corneal surface cause this favorable optical phenomenon. To some extent, this is used in the system and method of document US 2006/0110008. The known system uses a relatively large number of markers to find a marker that is within a threshold distance from the center of the pupil.
The known system and method, however, have a number of disadvantages. The known system uses temporal light modulation to encode the different identifiers, which introduces a latency (expressed in a number of frames) that is proportional to the code length. Accuracy is determined by the threshold distance: either the threshold distance is made small for high precision, which requires a high number of markers and thus a high temporal latency, or a relatively large threshold distance is accepted to reduce the temporal latency, in which case the accuracy of the determination of the gaze direction is compromised. Figure 6 illustrates the difference between the system and method of document US 2006/0110008 and the system and method of the invention: the left-hand scheme illustrates the known system and method, and the right-hand scheme the system and method according to the invention. In the known system and method, the markers form a fixed pattern covering the field of gaze. In the known system, schematically illustrated on the left side of Figure 6, temporal modulation is used to label each IR marker or marker group, for example using binary temporal coding. This means that to encode n different IR markers or groups of markers in binary, a code length of at least m = log2(n) bits is required. Consequently, m video frames are needed to identify all encoded IR markers or groups of markers. There are several disadvantages:
- Gaze tracking over, for example, 16 x 16 uniquely labeled IR markers requires the identification of 256 different encoded labels, so a latency of 8 frames is already required for a relatively coarse gaze estimate. The finer the gaze estimate, the more markers must be used, and the longer the temporal latency becomes.
- Given the mobility of the eye, temporal latency quickly gives rise to motion artifacts.
- In the application of eye-guided cursor control, the gaze tracker response should be instantaneous, with minimal latency.
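The latency argument above follows directly from the code length m = log2(n). A short arithmetic check (the function name is ours; the numbers are the patent's 16 x 16 example):

```python
import math


def code_length_frames(n_markers):
    # Binary temporal coding needs at least ceil(log2(n)) bits,
    # hence the same number of video frames, to uniquely label
    # n markers or marker groups.
    return math.ceil(math.log2(n_markers))


n = 16 * 16                     # 256 uniquely labeled IR markers
frames = code_length_frames(n)  # 8 frames of identification latency
```

Doubling the grid to 32 x 32 markers (1024 labels) would raise the latency to 10 frames, illustrating why finer fixed grids trade directly against responsiveness.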
The system and method of the invention, schematically shown on the right side of Figure 6, differ from the known system and method in that the position of the light source(s) is not fixed but dynamically adjustable. To establish 'pupil-glint coincidence', a movable marker is used, such that the reflection of the dynamically movable marker is maintained at the center of the pupil through a control mechanism. The relative positions of the pupil center and the marker reflection are detected. From these relative positions, a new marker position is computed, chosen to make the marker's corneal reflection coincide with the center of the pupil. This is schematically indicated in Figure 6 by a repositioning signal S. Using repositioning signal S, the marker is repositioned. It is noted that the 'repositioning signal' can be any data, in any shape, by which the marker can be repositioned. The result of repositioning is that the gaze direction is always along the line of, or at least very close to, the momentary marker position. Because the marker's position is continuously moved to make its reflection match the center of the pupil, the marker's physical position is updated to match the gaze. When the feedback loop is properly closed, the system continually provides the coordinates of the point of gaze, which are identical, or at least very close, to the coordinates of the marker. Figure 7 illustrates a system according to the invention. The system comprises a camera 71 (or, more generally, a means for obtaining an image). The camera, in operation, records an image of an eye 72 of an observer. The system comprises means 73 for providing a movable light marker 74 in the observer's field of view. This light source acts as a marker. The position of the marker within the observer's field of view can be dynamically adapted. To this end, the system comprises means 75 for adapting the position of the marker.
Within the concept of the invention, a light source is any medium that produces light rays that can be reflected by the eye. Image data are analyzed in an image analyzer 76. The image analyzer establishes the difference u-v between the corneal reflection of the light source (u) and the pupil center (v), i.e. data on the relative positions of the pupil center and the corneal reflection of the marker. The analyzer either directly calculates the data for repositioning the marker, or it provides data on the relative position of the marker reflection and the center of the pupil, and an additional analyzer provides, based on these relative positions, data for repositioning the marker. These data (or data derived from them) can be in the form of a repositioning signal, i.e. a signal which is, further down the system chain, used to reposition the marker. The signal is sent to the means 75 for adapting the position of the marker 74 so as to reduce the distance between the corneal reflection and the center of the pupil. The analysis means and the means for adapting can be combined in a control unit C. The displacement Δ, i.e. the repositioning, that the marker 74 undergoes is a function of the difference u-v in the recorded image. The image produced after the repositioning of the marker is again analyzed, and the process of adapting the position of the marker is repeated, if necessary, until the corneal reflection of the marker matches, with considerable precision, the center of the pupil. In this example, a single marker 74 is shown; within the concept of the invention, more than one mobile marker can be used to simultaneously track the gaze of several observers. The corneal reflection is the reflection on the outer surface of the observer's eye. In case the observer wears a contact lens, the outer surface of the contact lens is considered to form the outer surface of the observer's eye.
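A minimal sketch of how the analyzer 76 and control unit C could turn the difference u-v into a repositioning signal. The linear gain mapping image-plane pixels to marker displacement is an assumption on our part (the patent's Figure 16 suggests estimating such proportionality constants); all function names are illustrative.

```python
def repositioning_signal(u, v, gain=(1.0, 1.0)):
    """Compute the difference signal: u is the image position of the
    marker's corneal reflection, v the image position of the pupil
    center; gain maps image-plane pixels to marker displacement."""
    return (gain[0] * (v[0] - u[0]), gain[1] * (v[1] - u[1]))


def reposition(marker, signal):
    """Apply the displacement to the marker position on the object."""
    return (marker[0] + signal[0], marker[1] + signal[1])
```

For example, with u = (5, 5) and v = (9, 2), the difference signal is (4, -3), and a marker at (10, 10) moves to (14, 7); in practice the loop of Figure 7 repeats this until the distance between u and v is negligibly small.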
The invention is based on the insight that it is advantageous to establish a connection between the position of the marker and the point of gaze through a feedback control system. A control system adjusts the position of the marker until its reflection in the cornea coincides, or nearly coincides, with the center of the pupil. The difference between the position of the corneal reflection and the center of the pupil forms, or is at least the basis of, a feedback signal that is used to reposition the marker. It has been found that the temporal latency of a system according to the invention is much lower than that of the known system. It is also more accurate, since there is no fixed threshold value. Instead of fixed marker positions, the marker position is adjusted in dependence on a feedback signal derived from a measurement of the relative positions of the marker's corneal reflection and the center of the pupil. In the system and method of the invention, latency is reduced as the marker is allowed to move so that its reflection coincides with the center of the pupil. The basic principle of marker movement from feedback is illustrated in Figure 8. This figure shows the eye and an IR 'screen' during a sequence of events. Note that the screen pattern is mirrored in relation to the reflected pattern. The center of the pupil is indicated by an 'x', the corneal reflection of the marker by a '+'. Part A illustrates an IR marker in an initial position; the pupil also starts at an initial position. In part B, the IR marker is moved until its corneal reflection matches the center of the pupil (from dotted '+' to solid '+'). In part C, the observer changes the direction of gaze, and the pupil adopts a new position (from dotted 'x' to solid 'x'). In part D, the IR marker is again moved to keep its corneal reflection corresponding to the center of the pupil, again directly indicating the direction of gaze. The invention can be used in a wide variety of embodiments. Below, some non-restrictive examples are given.
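The sequence of Figure 8 can be simulated with a deliberately crude linear eye model (our assumption, not the patent's optics): the reflex moves by a fixed fraction of the marker displacement, and the controller only knows an imperfect estimate of that fraction. The feedback loop still converges, just in more steps:

```python
def track_once(pupil, marker, true_gain=0.5, est_gain=1.5, steps=10):
    """Iterate the feedback loop of Figure 8 for a stationary pupil.

    Crude linear eye model: reflex = true_gain * marker. The controller
    repositions using est_gain, its imperfect estimate of true_gain.
    Returns the final marker position and the error after each step."""
    errors = []
    reflex = true_gain * marker
    for _ in range(steps):
        err = pupil - reflex          # difference signal
        marker += err / est_gain      # reposition with estimated gain
        reflex = true_gain * marker
        errors.append(abs(pupil - reflex))
    return marker, errors
```

In this toy model the residual shrinks by a factor |1 - true_gain / est_gain| (2/3 here) per frame; with a well-tuned gain (est_gain close to true_gain) a single frame suffices, consistent with the single-frame convergence mentioned earlier.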
In several of the examples, the marker, for the sake of simplicity, will be described as an infrared (IR) marker. Although the use of IR markers forms preferred embodiments, it is explicitly stated that other types of light markers, such as UV markers or markers in the visible light region, can be used. The system has a control system that performs:
- a position comparison operation (shown in Figure 7 as the means of analysis) that compares the position of the corneal light reflection of the marker, e.g. an IR marker, with the apparent position of the center of the pupil;
- a position calculation operation that calculates a new position of the IR marker, or the difference between the present position and the ideal position, with the aim that the new or adjusted position will cause the reflection of the IR marker and the apparent position of the center of the pupil to coincide;
- an adaptation of the position of the marker according to the newly calculated position.
The data used to adapt the position are also called the "repositioning signal". It is noted that, within the method and system, a feedback loop is used, so an initially less than ideal position calculation will still have the effect that the marker reflection, after repositioning, comes closer to the center of the pupil, only not in the most effective way; a next calculation step will then improve the match. In preferred embodiments, the parameters of the position calculation operation are adjustable, to fine-tune the calculation and improve accuracy so that as few iterative steps as possible are required. This improves the speed of the calculations and thus reduces the latency of the feedback loop. Calculations can be performed step-by-step or in a single operation. Figure 9 illustrates a method for finding the center of the pupil. Pupil segmentation can be aided by the use of light sources close to the camera, positioned as close as possible to the optical axis of the camera.
Although this solution is not new and is widely applied in conventional eye tracking, it can also be used in the proposed new system. When this 'coaxial illumination' is active, an IR 'red-eye' effect is created which causes the retina to light up in the pupil, as shown in Figure 9. The pupil disc follows from the difference between two images, one with and one without coaxial illumination. Figure 9 schematically shows an eye image captured with a single-line intensity profile under a) illumination from non-coaxial sources; b) illumination from a coaxial source; c) the differential image, b minus a. Pupil center estimation can be performed using a wide variety of algorithms, the computationally cheapest of which is the calculation of the center of mass of the pupil disc projection. The invention can be carried out in a variety of ways in a variety of systems. The system may comprise a display device and means for providing an on-screen marker. The means for providing a marker on the screen can be integrated into the display device, or it may be separate from the display device. An example of an integrated device is one in which the display device provides both the image and the marker. Examples of separate means for providing a marker are a device which projects a marker onto a display screen, or a device comprising a separate transparent shield positioned in front of a display screen on which an IR marker can be generated. Figure 10 illustrates an example of a system according to the invention. A first embodiment is based on the use of an LCD screen 102 that has an IR backlight in addition to the normal visible (RGB) backlight, as illustrated in Figure 10. The implementation and control of the system, using the control unit C, is in principle similar to that of scanning-backlight LCD systems and particularly color-sequential LCD systems. The video camera (cam) is synchronized with the backlighting system.
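The differential-image step and center-of-mass estimate described for Figure 9 admit a compact sketch (NumPy-based; the threshold fraction and the array shapes are our assumptions):

```python
import numpy as np


def pupil_center(img_coaxial, img_noncoaxial, frac=0.5):
    """Estimate the pupil center from two eye images: one with coaxial
    IR illumination (bright pupil) and one without (dark pupil).
    Their difference isolates the pupil disc, whose center of mass is
    the cheapest center estimate."""
    diff = img_coaxial.astype(float) - img_noncoaxial.astype(float)
    mask = diff > frac * diff.max()   # segment the pupil disc
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()       # center of mass, as (x, y)
```

For a synthetic bright-pupil disc centered at (35, 25), the estimate recovers that center exactly; on real images the threshold fraction would need tuning against sensor noise.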
Synchronization causes the backlight to switch between IR and visible light. During the visible-light phase, the normal image is shown. During the IR phase, the LCD panel 102 displays a pattern that serves as the mobile IR marker 103; during this phase, the camera (cam) captures an image of the eye 101 with the marker reflected in it. The center of the pupil can be obtained using the method discussed above, for example during the visible-light phase. Experiments have shown that liquid crystal material is able to modulate near-infrared light in the same way that it modulates visible light. Figure 11 illustrates the relative transmission power T of a commonly used liquid crystal material in the visible light region 111 and the near-infrared light region 112 as a function of wavelength. The transmission curve T for liquid crystal material in the open state is such that visible light, as well as near-IR light, is transmitted. In the closed state, as illustrated by curve C, neither visible light nor IR light is transmitted. Figure 11 thus shows that a liquid crystal display is capable of modulating IR light and thereby creating a mobile IR marker. The IR light source can also be lit continuously, as the human eye hardly perceives near-infrared light anyway. Synchronization is, however, preferred, as it makes it easier to distinguish the IR light of the marker source from other spurious IR reflections. The camera cam records the image, the control system C (or a calculator inside the camera) calculates the difference between the corneal reflection of the IR marker and the center of the pupil, the position on the LCD screen to which the marker must move so as to coincide with the center of the pupil is calculated, the IR marker is repositioned, and the process is repeated if necessary. In this example, the means for providing a marker is, to a high degree, integrated into the display device.
A second embodiment of IR marker overlay uses a spatially segmented IR backlight in addition to the visible backlight. By choosing the wavelength of the IR light far enough away from the visible range, it was found that the liquid crystal material becomes transparent to IR light, regardless of the open or closed state of the pixels. The advantage is that the visible backlight can remain active, and the liquid crystal panel can be used exclusively for the normal visible image. The resolution of the IR marker is then that of the segmented IR backlight. The segmented IR backlight is controlled to control the position of the IR marker. In this example, the means for providing a marker is also integrated into the display device, but to a lesser degree than in the first example. Alternatively, the means for providing a marker can be formed by a projector which projects an IR beam, through a system of moving mirrors, onto the liquid crystal display device. In this example, the means for providing a marker is much more separate from the display device. Going one step further would be to provide a transparent plate on which a movable IR marker is provided. This can be done, for example, by projecting a marker onto a transparent screen. The image to the viewer is then the field of view through the transparent plate. Starting from the LCD screen, a marker can also be provided as follows: an image is displayed according to the image information provided to the screen. At the marker position, the intensity (or, for example, the intensity of one of the colors) is reduced by a small amount. In the next frame, the intensity at the marker position is increased by the same amount. If this intensity addition and subtraction is performed at a high enough frequency, the human eye cannot perceive any flicker due to the changes in intensity. However, a camera synchronized to the marker frequency can, by subtracting two sequential images, find the marker position.
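A minimal sketch of this temporal hiding scheme, with an illustrative marker position and intensity step (both assumptions, not values from the patent):

```python
import numpy as np

def embed_marker(frame, pos, delta, sign):
    """Add (+delta) or subtract (-delta) intensity at the marker
    position; alternating the sign over frames keeps the average
    image unchanged, so at a high frame rate no flicker is seen."""
    out = frame.astype(np.int16).copy()
    y, x = pos
    out[y, x] += sign * delta
    return np.clip(out, 0, 255).astype(np.uint8)

def detect_marker(frame_a, frame_b):
    """A camera synchronized to the marker frequency recovers the
    marker as the pixel of maximum absolute difference between two
    sequential frames."""
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return tuple(int(i) for i in np.unravel_index(np.argmax(diff), diff.shape))

img = np.full((8, 8), 128, dtype=np.uint8)     # mid-gray test frame
f1 = embed_marker(img, (3, 5), 4, +1)
f2 = embed_marker(img, (3, 5), 4, -1)
print(detect_marker(f1, f2))   # → (3, 5)
```

The sketch also makes the drawback from the text visible: at pixel values near 0 or 255 the clipping would destroy the symmetric add/subtract, which is why nearly black screen regions are problematic.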
The advantage of this embodiment is that it can be used in and for any existing display device, as long as the display device has a high enough frame rate. Most modern devices are capable of this. It is not necessary to provide an IR source to provide an IR marker. In this embodiment, the marker is hidden in the visible signal, indistinguishable to an observer, but distinguishable to a camera tuned to the marker's temporal signature. The downside is that on parts of the screen that are nearly black, it could be difficult to add and subtract intensity. However, the human eye in any case tends to look at parts of the screen with the highest, or at least a high, intensity. The fact that at least two frames are needed to find the marker is a disadvantage, but screens are being provided with ever increasing frame rates. Alternatively, the marker can be provided with a spatial signature that is not perceptible to an observer, but is to a camera tuned to the spatial signature. An alternative method of providing hidden markers using known display devices is possible if the device is capable of making the same color in two different ways. Devices called RGB-W are known. Such display devices comprise red, green and blue pixels as well as white pixels. This allows white areas in the image (or indeed areas within a wide range of colors) to be made in two different ways. The simplest example is a device in which a white area can be made up of a combination of red, green and blue pixels, or of white pixels. To the human eye, the difference between "RGB"-white and "white"-white areas is invisible. However, in the IR and/or UV regions, or in a particular region of the visible light spectrum, the difference is detectable. The screen does not change and the information displayed does not change either.
However, by using the possibility that white light can be produced in two different ways, it is possible to provide a marker in the image that is visible to a camera, for example an IR or UV camera, or a visible light camera with a color filter in front of it, but invisible to the observer. The marker is then distinguishable by its spectral signature, signature meaning distinctive feature. This embodiment can, for example, be advantageously used to track the gaze of a human observer reading a text. Texts are composed of relatively small black letters on a white background. Likewise, one can track the human eye following an animated figure on a white background. The same type of method can be used for any device that is capable of providing a color in two different ways. Examples include so-called spectrum-sequential displays, also called spatiotemporal hybrid color displays. These displays combine conventional LCD screens and color-sequential LCD screens. Such a display conventionally has addressable elements with two (wideband) color filters (e.g. magenta and cyan) and two types of backlight color fields (e.g. cyan and yellow), although other combinations of color fields and color filters can be used, for example: (1) magenta and cyan color filters with yellow and blue color fields, and (2) magenta and green color filters with yellow and cyan color fields. A further embodiment is based on the use of a light-guide diffuser plate, which is positioned in front of a screen and illuminated by peripheral segmented IR illumination. This embodiment is illustrated in Figure 12. A diffuser light-guide plate 122 is used which is positioned in front of the screen 121 and illuminated by the peripheral segmented IR illuminators 123. The transparent diffuser light guide 122 acts as a means to create an invisible IR marker without obstructing a free view of the image, screen, or scenery behind it.
By using a marker that covers at least two peripheral IR illuminators 123, and by gradually varying their mutual intensity as the IR marker moves into position, the center of the marker can be positioned at a resolution finer than the distance between illuminators 123. Thus, the number of illuminators 123 can be relatively small, while still providing a high resolution. A variation of this embodiment is based on the use of a Wedge display. A Wedge display essentially comprises a light guide that transfers a focused light field from a projector to a projection screen by consecutive internal reflections, such that the geometric integrity of the light rays is preserved. Figure 13 illustrates such a device. Figure 13 shows that the light guide can be transparent, as the projected light is confined by internal reflection. The light field finally escapes the guide as the rays cross the material interface at less than the critical angle. In the absence of a front diffuser and a back cover, this technology is suitable for creating a translucent screen that allows a clear view through the screen of the scenery behind it. In a further embodiment, IR illumination is achieved by converting UV and/or visible light to infrared light using fluorescence. The goal is to obtain an efficient way to generate IR light in situations (typically outdoors) where ambient IR light deteriorates the readability of the IR marker(s) by the camera system. In increasingly bright daylight, the light emitted by the IR marker then increases in intensity as well. The fluorescent IR light source can be thought of as a continuous backlight, so local or global shutters can be used to modulate its intensity. A range of fluorescent pigments is known which can provide a wide variety of (transparent) plastics with a fluorescent property. In this product range, there are also pigments available that convert visible light to IR light. Such pigments are, for example, used in industrial applications such as laser welding of clear plastics.
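The sub-illuminator resolution described at the start of this passage follows from intensity-weighted interpolation between two adjacent illuminators; a minimal sketch, with illustrative positions and intensities:

```python
def marker_center(p1, p2, i1, i2):
    """Effective center of a marker spanning two peripheral IR
    illuminators at positions p1 and p2, driven with intensities
    i1 and i2: the intensity-weighted centroid lands between the
    illuminators, giving sub-illuminator positioning resolution."""
    return (i1 * p1 + i2 * p2) / (i1 + i2)

# Illuminators 10 units apart; a 3:1 intensity ratio places the
# marker a quarter of the way from p1 to p2.
print(marker_center(0.0, 10.0, 3.0, 1.0))   # → 2.5
```

Sweeping the ratio i1:i2 from 1:0 to 0:1 moves the marker center continuously from p1 to p2, which is why a small number of illuminators still yields fine positioning.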
The application of such pigments as a passive IR illuminator in the context of eye tracking is not known. In a further embodiment, the IR marker is projected by a projector which emits only IR light, invisible to the user. The IR marker is a focused image that is projected onto a normal visual screen, or onto a sufficiently diffusing transparent screen that allows a clear view of the scene behind it, or onto the scene itself. In a further embodiment, the IR marker image and a normal visual image are generated by two separate screens, whose images are superimposed using a semi-transparent mirror, in the classic teleprompter or autocue configuration. Figure 12 illustrates that the marker need not be a dot. A marker is any point, area, or pattern that can be distinguished as such. It can be a dot, circle, cross, or other pattern. Several ways to make a marker are described above and below. Where an "IR marker" is mentioned below, it should be recognized that a marker may also be in the visible range, or in the UV range. "IR marker" is used below because, for many embodiments, the use of an IR marker is preferred. However, this does not form a restriction on the invention in a broader sense. The shape of the IR marker is preferably such that its reflection in the eye is readily captured by a camera and readily detected by machine vision algorithms for the purpose of position tracking. However, the shape of an IR marker can be limited by the way in which the marker is generated and superimposed over the visible image. One realization of a feedback-based gaze tracking system would be based on a dot-shaped marker, whose reflection becomes a dot-shaped glint on the cornea. Although most current gaze tracking systems rely on dot-shaped IR sources, the accuracy of position estimation from their reflection is ultimately governed by the resolution of the sensor.
The marker is preferably comprised of a connected shape, i.e. a pattern. An embodiment with a connected shape is already shown in Figure 12. The connected shape is, in the example of Figure 12, based on the use of a cross ('+') pattern consisting of two intersecting lines that span the entire width and height of the target screen. In this case, the gaze point is associated with the intersection point of the two lines. An advantage of the cross pattern is that it can be generated using peripheral illumination, as proposed in the embodiment of Figure 12 above. In Figure 14, it is shown how a cross is clearly visible as a reflection in the eye. Pattern connectivity allows sub-pixel retrieval of the intersection coordinates. A further realization is based on the use of a pseudorandom IR light pattern, which potentially provides more robust detection with a camera system, as simpler patterns can give rise to false pattern detections. Figure 15 shows such an example. The pattern looks random but is pseudorandom. The camera can recognize the pattern and can find a point within the pattern, for example the center of gravity. This center of gravity then provides the 'marker' position. A complicated pattern can avoid problems with false pattern recognition. A false pattern recognition can, for example, occur when the camera and/or the marker recognition algorithm erroneously recognizes another IR source (e.g., the reflection of a light bulb) as the IR marker. Such misrecognition can be corrected by checking the result of a change in position of the IR marker, as the true IR marker will move, whereas the 'false' IR marker will not; under circumstances, however, this can reduce the accuracy or the reaction time of the system. An additional embodiment uses spatial displacement of the IR target to resolve the relationship between the position of the IR target in the real world and the position reflected in the cornea.
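One way (an assumption on our part, not the patent's stated algorithm) to detect a pseudorandom pattern and reduce it to a single 'marker' point is exhaustive zero-mean correlation over the camera image; the 5x5 binary pattern and the scene below are illustrative:

```python
import numpy as np

# Illustrative pseudorandom binary IR pattern (every row and column
# contains at least one bright element, so shifted copies correlate
# strictly worse than the aligned copy).
pattern = np.array([[1, 0, 1, 1, 0],
                    [0, 1, 0, 0, 1],
                    [1, 1, 0, 1, 0],
                    [0, 0, 1, 1, 1],
                    [1, 0, 0, 1, 0]], dtype=float)

def find_pattern(image, pat):
    """Locate the pattern by exhaustive zero-mean correlation and
    return the center of the best-matching window, which serves as
    the 'marker' position."""
    ph, pw = pat.shape
    best, best_pos = -np.inf, None
    for y in range(image.shape[0] - ph + 1):
        for x in range(image.shape[1] - pw + 1):
            win = image[y:y + ph, x:x + pw]
            score = np.sum((win - win.mean()) * (pat - pat.mean()))
            if score > best:
                best, best_pos = score, (y + ph // 2, x + pw // 2)
    return best_pos

scene = np.zeros((20, 20))
scene[8:13, 4:9] = pattern          # reflected pattern at offset (8, 4)
print(find_pattern(scene, pattern))  # → (10, 6), the pattern center
```

A single stray bright dot (the "reflection of a light bulb" from the text) correlates poorly with the full pattern, which is the robustness argument made above.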
Since we propose a feedback mechanism to control the position of the IR target so that it finally coincides with the gaze point, in the absence of a model the gaze point is typically found in consecutive iterative steps. Each iteration step requires a new video frame, showing the reflection of a new IR target position. The following method is intended to minimize the number of iterations and therefore the latency of the response. The relationship between the real-world target coordinates and the position reflected on the cornea is not a linear transformation (affine or perspective) due to the curvature of the cornea. Figure 16 illustrates the use of two separately acquired IR targets and their appearance as reflections on the cornea as a way of estimating two proportionality constants, which serve as a first-order (linear) approximation of the actual relationship. The two proportionality constants can be defined as Cx = Δx_target / Δx_reflection and Cy = Δy_target / Δy_reflection, where the differences are taken between the two consecutive target positions and between their corneal reflections, respectively. Experiments have shown that an estimate of Cx and Cy from two consecutive measurements is sufficient to match the gaze point almost exactly in the next iteration. By keeping a history of past differences and past values of Cx and Cy, the response to large saccadic movements has been shown to be almost instantaneous. In the absence of a more accurate model, each consecutive step can also be based on a range of existing methods, such as bisection. In a further, refined realization, the above method includes a coordinate storage facility to keep the history of actual and reflected target positions, as well as a model of the relation between the actual target position and its reflected position. The model allows dealing with the non-linear relationship between real and reflected coordinates, and provides a more accurate prediction of the next best IR marker position to be used in the feedback loop. The invention can be embodied in a number of different ways. In the above embodiments, use is made of a marker.
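The first-order update with the proportionality constants Cx and Cy can be sketched as follows; the toy reflection model (reflection = a * marker + b) and all numbers are illustrative assumptions standing in for the real cornea:

```python
def update(m1, m2, r1, r2, pupil):
    """Predict the next marker position from two previous marker
    positions (m1, m2) and their corneal reflections (r1, r2):
    estimate Cx = Δmarker_x / Δreflection_x (and Cy likewise), then
    extrapolate to drive the reflection onto the pupil center."""
    cx = (m2[0] - m1[0]) / (r2[0] - r1[0])
    cy = (m2[1] - m1[1]) / (r2[1] - r1[1])
    return (m2[0] + cx * (pupil[0] - r2[0]),
            m2[1] + cy * (pupil[1] - r2[1]))

# Toy cornea model (hypothetical): reflection = 0.05 * marker + offset.
refl = lambda m: (0.05 * m[0] + 5.0, 0.05 * m[1] + 4.0)
m1, m2 = (0.0, 0.0), (40.0, 20.0)
m3 = update(m1, m2, refl(m1), refl(m2), pupil=(12.0, 8.0))
print(refl(m3))   # → (12.0, 8.0): the reflection lands on the pupil center
```

Because this toy model really is linear, two measurements suffice for an exact hit in one step; on a curved cornea the same update gives the "almost exact" next iteration the text describes, with the history of Cx and Cy refining subsequent steps.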
In embodiments, detection of multiple pairs of eyes indicates that multiple observers are present; a situation typical of public displays and shop windows. After detecting each additional observer, the system adds an additional marker and updates the synchronization with the camera so that detection and tracking are distributed over time across the observers. This system is illustrated in Figure 17. The system in Figure 17 thus allows tracking the gaze of multiple (N) observers by performing IR target presentation, image capture and gaze estimation in a time-sequential manner. Note that the arrows indicate the order over time, not the flow of information. It is noted that, although several markers are used, there is a fundamental difference with the known system: the markers are movable, and each marker has its own feedback and associated display. The advantage of this system is that latency increases only as the number of simultaneously tracked observers increases. The method and system according to the invention can be used for a wide variety of applications. Calibration-free gaze tracking opens up a wide variety of applications, far beyond the traditional fields in which current eye-tracking systems are applied. The absence of calibration means freedom of head movement in relation to the tracking device. The absence of calibration also means 'instantly on' operation for the eye tracking process. In the field of consumer products, calibration-free gaze tracking can have a variety of applications: - Control (typically of a mouse cursor in combination with a tangible confirmation button): as long as the gaze tracker provides sufficient accuracy, several studies have shown that gaze tracking is more efficient for point-and-click tasks than a conventional computer mouse. The improved reaction speed and accuracy of the method and system of the invention offer a great advantage in this field.
- For fixed devices, such as a TV (with internet access), the method and system of the invention provide robust cursor control for basic point-and-click actions, so that one-button internet browsing "from the couch" becomes possible. - For handheld devices, the solution can free up the other hand for those operations that currently require the use of a stylus or interaction with a touch screen. Many portable devices already comprise a camera, so the invention can be implemented relatively easily in portable devices. Devices can become "aware", so that they respond to our actions (a button press or a voice command) only when we look at the device. - Communication: there are several methods for conveying eye contact in video conferences between two or more parties, many of which require the gaze point of the individuals participating in the conference. In the past, existing eye tracking has proven to be highly ineffective here; a concept that never left the lab due to the mentioned limitations of current systems. The improved speed and accuracy of the inventive system and method, when used for multiple parties, increase efficiency and confirm users' sense of the relevance of eye contact during video calls. In the field of health and well-being, eye tracking provides a robust way to monitor the level of attention to dangerous tasks. - Control: gaze still provides an important user interface for people who suffer from a severe lack of motor skills, improving their quality of life by increasing their level of independence and well-being. - Gaze can provide non-contact control in a hospital operating room, reducing the risk of contamination from tactile user interfaces. - Monitoring: the invention provides a discreet way to monitor the level of attention during monotonous, dangerous tasks, such as driving a vehicle.
One example is monitoring a truck driver's visual attention, using the windshield as a means of superimposing a moving but unnoticeable IR target. - Simulators, such as driving or flight simulators. In simulators, it is a great advantage if the trainee's gaze can be monitored quickly and accurately. In some simulators, speed is of higher importance, for example in flight simulators or F1 racing simulators; a fast and reliable way to track the gaze of the F1 driver or airline pilot is of high importance, as important decisions must be made in fractions of a second by such people. - Sports: a lot can be learned from tracking a person's gaze in high-speed sports such as tennis, squash, or contact sports. Improved training methods become possible. In the field of retail and digital signage, eye tracking can be used to provide valuable feedback regarding the viewer's focus of attention. Current eye tracking systems have already found their way to advertising agencies for analyzing individual responses to online and print advertisements. The improved speed and accuracy of the system and method of the invention will greatly expand this field of operation. Gaze tracking is also being applied to measure collective real-life focal attention to a specific screen or commercial poster, by counting the pairs of eyes looking at them, as well as the duration of their attention. The invention is not restricted to the types of display shown. An example of another type of display that can be advantageously used in the method and system of the invention is an OLED display. Using OLED technology, a plate of transparent material can be made to emit light. OLED technology also allows the material to emit IR light. The OLED material remains transparent to visible light but emits IR light. Using electrodes, an array of OLED IR pixels can be made in a transparent layer.
The OLED IR array acts in the same way as a conventional OLED screen, with the exception that it is transparent to visible light and emits IR light from addressable pixels. The IR marker is addressed via an active matrix of OLED IR pixels, and its position is variable depending on the IR pixel addressing. Figure 19 schematically illustrates the use of a transparent OLED plate 191 in front of a screen 192. The OLED array is addressed using a controller 193. Controller 193 receives information I. Information I is dependent on measurements taken from the image captured by a CAM camera. The OLED pixel array provides an IR marker 194 on the transparent OLED material. The advantage of a transparent screen, like that of any screen that can be placed in front of a display or backdrop, is that it can be used with any existing display or backdrop. This transparent layer can be used as a screen in front of a display screen, or in front of a backdrop. The use of a transparent OLED screen has the advantage that the addressing speed of an OLED screen is remarkably high, allowing for even faster gaze tracking. The invention is not restricted to the given exemplary embodiments. For example, when use is made of a projected marker, the marker can be projected onto a display screen, onto a transparent screen through which the viewer watches a scene or a display device, or directly onto the scene. Figure 18 illustrates an embodiment in which projector P projects an IR target onto a backdrop, in this example represented by the center of a cross over the backdrop. In Figure 18, the target is centered on the cylinder. The camera is also shown. The camera takes an image of the eye and, more particularly, of the corneal reflection of the IR target and the center of the pupil. The invention can be used to track the gaze of more than one person, as explained above. The invention can also be used to track the gaze direction of both eyes of the same person.
In said embodiments, an image is obtained of the right and left eyes, the corneal reflection of one or both eyes is measured and, for at least one of the eyes, a marker is provided. In both-eye realizations, a single marker can be used, in which case the gaze of both eyes can be independently monitored. Alternatively, the gaze of one eye is tracked and, for the other eye, the gaze deviation is measured or estimated. One application is, for example, a device to quantitatively measure the degree of a "lazy eye" (amblyopia). A child is provided with a pair of dual-camera glasses, in which one small camera takes images of the left eye and another of the right eye. Alternatively, the cameras can both be at a distance from the child. It is also possible to use a single camera to record both images. The child is asked to play a video game, in which he or she must follow a certain target with the eyes. This will happen automatically if he or she must grab an object with the cursor, or take an animated figure through a maze. The movements of both eyes are measured. In a manner completely unobtrusive for the child, the reaction time of each eye to the motion of the animated figure can be measured, providing a measure of both the lazy eye and the coordination of both eyes to movement. In a way that is fun for the child, the lazy eye can be quantitatively measured, and the progress of lazy-eye treatment can be tracked without any discomfort for the child. Another application is to monitor, for both eyes, the movements of a driver looking through a windshield, on which an IR target is provided. In a normal alert state, both eyes will cooperate and make a certain movement pattern to follow what is happening on the road in front of the driver. It is also possible to measure the convergence of the eyes. When the driver is tired, this will show up in the eye movements and eye coordination, and a signal can be given for the driver to rest.
When the gaze of both eyes is tracked, it is possible to determine the gaze of each eye, the difference in gaze between the eyes, as well as the average gaze of both eyes. In such an application, the marker is not projected onto the observed object, the road, or part of an observed image on a display screen, but it is still associated with the observed object, in that the marker indicates the position of the observed object. The direction of gaze is aligned with the position of the marker on the windshield. The convergence angle of a pair of eyes can be used to estimate the distance at which an observer is looking at an object. This is especially useful in a monitoring situation (for example, monitoring the gaze of a truck driver, or a customer looking at a store window). System latency is dependent on the number of eyes to be scanned. Latency can be reduced by employing a single marker for each observer (pair of eyes). Averaging the two displacement vectors (one for each eye) yields the update vector. The marker position is then updated with this average. Since an observer's eyes will converge (assuming normal vision) toward the same location, any residual error is attributed symmetrically to the eye pair. The amplitude of the displacement vectors relative to the update vector correlates with convergence. When looking directly at the marker, all vectors will be zero (once the system has converged). When looking at an object farther away than the plane (of projection) of the marker, the horizontal displacement vector will be negative for one eye and equally positive for the other. Another application involves the possibility of extending the principle beyond human or animal eyes to include artificial eyes, for example in the case of robots. It is well known that the red-eye principle applies to cameras, regardless of the particular lens mounted on them.
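Averaging the two per-eye displacement vectors into a single update vector, and reading convergence off the opposite-sign horizontal residuals, can be sketched as follows; the example vectors are illustrative:

```python
def update_vector(left_disp, right_disp):
    """Average the per-eye displacement vectors (reflection-to-pupil
    offsets) into the single marker update; the residual of each eye
    relative to that average carries the convergence information."""
    avg = ((left_disp[0] + right_disp[0]) / 2,
           (left_disp[1] + right_disp[1]) / 2)
    # Horizontal residuals of opposite sign indicate that the gaze
    # converges in front of, or behind, the marker plane.
    residuals = (left_disp[0] - avg[0], right_disp[0] - avg[0])
    return avg, residuals

# Eyes fixating a point farther away than the marker plane:
# symmetric opposite-sign horizontal offsets, zero net update.
avg, res = update_vector((-3.0, 0.0), (3.0, 0.0))
print(avg, res)   # → (0.0, 0.0) (-3.0, 3.0)
```

A zero update vector with nonzero residuals is exactly the case described above: the marker is already at the gaze point in the marker plane, while the residual magnitude estimates the viewing distance via the convergence angle.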
On the other hand, the coincidence of the reflection of the light source with the center of the camera's pupil is due to the particular reflective properties of the tissues present in human and animal eyes. It would be possible to design special optical configurations which would maintain such coincidence, while at the same time providing the desired optical properties. The applications of such special lenses would be manifold. If adopted in the case of robots, they would allow detection of the robots' point of gaze. A different application would exploit these optical elements as markers rather than as lenses. In this context, optical elements would be applied on the surface of any object, as markers, allowing the establishment of the object's surface orientation, exploiting the same principle used in the described eye tracking system. In short, the invention can be described as follows: a system for tracking the gaze point of an observer observing an object comprises a camera for recording an image of an eye of the observer, means for providing a luminous marker, and means for analyzing the image of the eye to determine the reflection of the marker on the eye and the center of the pupil. The relative positions of the marker's corneal reflection and the center of the pupil are measured. The marker is repositioned, depending on the determined relative positions, to improve the correspondence between the marker's corneal reflection and the center of the pupil. The short description of the method corresponds to the description of the system above. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprise" does not exclude the presence of elements or steps other than those listed in a claim. The invention may be implemented by any combination of features of the various preferred embodiments described above. A camera is any device for recording images.
It could be part of a device also used for other purposes, such as a communication device, or be attached to, integrated with, or cooperating with such a communication device. Means for analyzing, means for addressing, means for calculating, etc. may be in the form of hardware, software, or any combination thereof. Where use is made of the method and system, any such means may be present in or near the camera, or in a nearby device to which the camera is connected. However, image data can also be analyzed remotely, just as the markers can be controlled remotely. For example, in a store, several cameras can observe different points, each point being, for example, a shop window, and at each point several observers can be tracked. The calculation of the difference between markers and pupil centers, the calculation of the movement to be made by each marker, and the signal to be sent to move the markers can be processed in a central computer inside the store, or, if the same is done at various stores across the country, in a central computer at headquarters, or on a central server. The invention is also related to computer programs comprising program code means for performing a method according to the invention when said program is run on a computer, as well as to a computer program product comprising program code means stored on a computer-readable medium for carrying out a method according to the invention.
Claims (18) [0001] 1. SYSTEM FOR TRACKING THE GAZE POINT OF AN OBSERVER OBSERVING AN OBJECT, characterized in that the system comprises a device (cam, 71) for recording an image of an observer's eye; a light source (73, 102, 122, 191, 192) configured to provide a luminous marker (74) on or associated with the observed object, said marker (74) being selectively movable within the observer's field of view; and a control unit comprising a first image analyzer (76) configured to analyze the image recorded by the device for recording images, said first image analyzer (76) being configured to determine, from the image, a position of a reflection of the cornea (q) of the marker (74) on the eye of the observer and the refraction of the cornea (r) of the center of the pupil (p) of the eye of the observer, wherein the system for tracking further comprises: said first image analyzer (76) being configured to determine relative positions of the corneal reflection (q) of the marker (74) and the corneal refraction (r) of the center of the pupil (p); said control unit comprising a second analyzer configured to receive the determined relative positions of the corneal reflection (q) of the marker (74) and the corneal refraction (r) of the center of the pupil (p), and configured to generate data for repositioning the marker (74) so as to decrease the difference between the determined relative positions until the position of the corneal reflection (q) of the marker (74) and the position of the corneal refraction (r) of the center of the pupil (p) substantially coincide; and means for repositioning the marker (74) in response to the repositioning data. [0002] 2. SYSTEM according to claim 1, characterized in that the system further comprises a display device, and the position of the marker (74) is related to the display device. [0003] 3. 
SYSTEM according to claim 2, characterized in that the display device and the light source for providing the marker (74) are integrated into a single device. [0004] 4. SYSTEM according to claim 2, characterized in that the display device and the light source for providing the marker (74) are separate devices. [0005] 5. SYSTEM according to claim 4, characterized in that the light source for providing the marker (74) comprises a transparent shield positioned in front of an image screen of the display device. [0006] 6. SYSTEM according to claim 1, characterized in that the light source for providing the marker (74) provides an infrared (IR) marker. [0007] 7. SYSTEM according to claim 1, characterized in that the marker has the shape of a pattern. [0008] 8. SYSTEM according to claim 1, characterized in that the light source for providing the marker (74) and the means for repositioning the marker (74) provide and reposition the marker (74) in or on a transparent plate in front of a scene that includes the object. [0009] 9. SYSTEM according to claim 8, characterized in that the transparent plate is the windshield of a vehicle. [0010] 10. SYSTEM according to claim 8, characterized in that the transparent plate is a shop window. [0011] 11. SYSTEM according to claim 3, characterized in that the light source for providing the marker provides the marker in an image displayed on a display screen of the display device. [0012] 12. SYSTEM according to claim 11, characterized in that the marker is an infrared (IR) marker. [0013] 13. SYSTEM according to claim 11, characterized in that the system comprises a liquid crystal display device that has an infrared (IR) backlight and a visible light backlight. [0014] 14. SYSTEM according to claim 1, characterized in that the light source for providing the marker (74) comprises a transparent OLED screen for providing a marker. [0015] 15. 
SYSTEM according to claim 1, characterized in that the system is arranged to track the gaze of more than one observer by providing a single respective additional marker for each additional observer. [0016] 16. METHOD FOR TRACKING THE POINT OF GAZE OF AN OBSERVER OBSERVING AN OBJECT, characterized in that it comprises recording an image of an eye of the observer, providing a luminous marker (74) on or associated with the observed object, said marker being selectively movable within the observer's field of view, and analyzing the recorded image to determine, from the image, a position of the corneal reflection (q) of the marker (74) and of the corneal refraction (r) of the center of the pupil (p) of the observer's eye, the method further comprising: determining the relative positions of the corneal reflection (q) of the marker (74) and the corneal refraction (r) of the center of the pupil (p) of the observer's eye, generating data to reposition the marker (74) so as to decrease the difference between the determined relative positions of the corneal reflection (q) of the marker (74) and the corneal refraction (r) of the center of the pupil (p), and repositioning the marker (74) in response to the repositioning data. [0017] 17. METHOD according to claim 16, characterized in that the marker (74) has the form of a pattern. [0018] 18. METHOD according to claim 16, characterized in that the marker (74) is provided in or on a transparent plate.
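Claims 1 and 16 describe a closed-loop scheme: measure the offset between the marker's corneal reflection (q) and the refracted pupil center (r) in the eye image, then reposition the marker so the offset shrinks; when q and r substantially coincide, the marker position coincides with the point of gaze. The following Python sketch illustrates one iteration of that loop under the simplifying assumption that the glint offset varies roughly linearly with the marker displacement. All names (`track_gaze_step`, `find_glint`, `find_pupil_center`) and the gain constant are hypothetical illustrations, not from the patent.

```python
def track_gaze_step(eye_image, marker_pos, find_glint, find_pupil_center,
                    gain=0.5, tol=1.0):
    """One iteration of the marker-repositioning loop described in the claims.

    Measures the offset between the marker's corneal reflection (q) and the
    refracted pupil center (r), and returns an updated marker position that
    reduces that offset, plus a flag telling whether q and r substantially
    coincide (i.e. the marker position is the point of gaze).
    """
    qx, qy = find_glint(eye_image)           # corneal reflection of the marker (q)
    rx, ry = find_pupil_center(eye_image)    # refracted pupil center (r)
    dx, dy = rx - qx, ry - qy                # offset in image coordinates
    if abs(dx) <= tol and abs(dy) <= tol:
        return marker_pos, True              # q and r coincide: gaze point found
    mx, my = marker_pos
    # Nudge the marker toward the gaze point; the gain absorbs the unknown
    # image-to-display scale and keeps the iteration stable (0 < gain*scale < 2).
    return (mx + gain * dx, my + gain * dy), False
```

The sign and scale of the update depend on the optical geometry of the cornea, camera, and display; the claims leave these to the repositioning means, so a practical system would tune (or adapt) the gain rather than rely on the fixed value assumed here.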
Family patents:
Publication number | Publication date
JP5816257B2 | 2015-11-18
US20130002846A1 | 2013-01-03
CN102802502A | 2012-11-28
JP2013528406A | 2013-07-11
RU2565482C2 | 2015-10-20
RU2012144651A | 2014-04-27
WO2011117776A1 | 2011-09-29
CN102802502B | 2016-03-30
US9237844B2 | 2016-01-19
EP2549914B1 | 2019-06-05
BR112012023787A2 | 2017-12-12
EP2549914A1 | 2013-01-30
Legal status:
2018-03-06 | B25D | Requested change of name of applicant approved | Owner name: KONINKLIJKE PHILIPS N.V. (NL)
2018-03-20 | B25G | Requested change of headquarters approved | Owner name: KONINKLIJKE PHILIPS N.V. (NL)
2019-01-08 | B06F | Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]
2019-12-10 | B06U | Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]
2021-05-25 | B09A | Decision: intention to grant [chapter 9.1 patent gazette]
2021-08-03 | B350 | Update of information on the portal [chapter 15.35 patent gazette]
2021-08-10 | B16A | Patent or certificate of addition of invention granted [chapter 16.1 patent gazette] | Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 15/03/2011, SUBJECT TO THE LEGAL CONDITIONS. PATENT GRANTED IN ACCORDANCE WITH ADI 5.529/DF, WHICH DETERMINES THE CHANGE OF THE TERM OF GRANT.
Priority:
Application number | Filing date | Patent title
EP10157169.3 | 2010-03-22 |
EP10157169 | 2010-03-15 |
PCT/IB2011/051076 | WO2011117776A1 | 2010-03-22 | 2011-03-15 | System and method for tracking the point of gaze of an observer